Strong Chain Rules for Min-Entropy under Few Bits Spoiled
Author
Abstract
It is well established that the notion of min-entropy fails to satisfy the chain rule H(X,Y) = H(X|Y) + H(Y) known for Shannon entropy. The lack of a chain rule causes many technical difficulties, particularly in cryptography, where the chain rule would be a natural way to analyze how min-entropy is split among smaller blocks. Such problems arise, for example, when constructing extractors and dispersers. We show that any sequence of variables exhibits a very strong block-source structure (conditional distributions of blocks are nearly flat) when we spoil a few correlated bits. This implies that, conditioned on the spoiled bits, splitting-recombination properties hold. In particular, we obtain many convenient properties that min-entropy does not obey in general, for example strong chain rules, "information can't hurt" inequalities, equivalences of average- and worst-case conditional entropy definitions, and others. Quantitatively, for any sequence X_1, ..., X_t of random variables over an alphabet X we prove that, when conditioned on m = t · O(log log |X| + log log(1/ε) + log t) bits of auxiliary information, all conditional distributions of the form X_i | X_{<i} are ε-close to being nearly flat (off by only a constant factor). The argument is combinatorial (based on simplex coverings). This result may be used as a generic tool for exhibiting block-source structures. We demonstrate this by reproving the fundamental converter due to Nisan and Zuckerman (J. Computer and System Sciences, 1996), which shows that sampling blocks from a min-entropy source roughly preserves the entropy rate. Our bound implies, by straightforward chain rules alone, an additive loss of o(1) (for sufficiently many samples), which qualitatively matches the first tighter analysis of this problem, due to Vadhan (CRYPTO'03), obtained by large-deviation techniques.
Keywords: chain rule, min-entropy, spoiling knowledge, block sources, local extractors
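To make the failure of the chain rule concrete, here is a small illustrative Python sketch (a toy example of ours, not taken from the paper): for a hypothetical two-variable distribution it computes the joint min-entropy, the worst-case conditional min-entropy, and the average-case conditional min-entropy, and shows that H_inf(X,Y) can strictly exceed H_inf(X|Y) + H_inf(Y) under either conditional definition.

```python
# Toy illustration (not from the paper): min-entropy violates the
# Shannon-style chain rule H(X,Y) = H(X|Y) + H(Y).
from math import log2

# Hypothetical joint distribution over X in {0,1,2,3}, Y in {0,1}:
# given Y = 0 (prob 0.9) X is uniform on 4 values; given Y = 1 (prob 0.1) X = 0.
joint = {(x, 0): 0.9 / 4 for x in range(4)}
joint[(0, 1)] = 0.1

def min_entropy(dist):
    """H_inf = -log2 of the largest probability in the distribution."""
    return -log2(max(dist.values()))

# Marginal of Y and conditional distributions of X given Y.
p_y = {y: sum(p for (_, yy), p in joint.items() if yy == y) for y in (0, 1)}
p_x_given_y = {
    y: {x: joint.get((x, y), 0.0) / p_y[y] for x in range(4)}
    for y in (0, 1)
}

h_xy = min_entropy(joint)                                   # joint min-entropy
h_y = min_entropy(p_y)                                      # min-entropy of Y
h_worst = min(min_entropy(p_x_given_y[y]) for y in (0, 1))  # worst-case conditional
h_avg = -log2(sum(p_y[y] * max(p_x_given_y[y].values())     # average-case conditional
                  for y in (0, 1)))

print(f"H_inf(X,Y)                      = {h_xy:.3f}")           # about 2.152
print(f"H_inf(X|Y) + H_inf(Y) (worst)   = {h_worst + h_y:.3f}")  # about 0.152
print(f"H_inf(X|Y) + H_inf(Y) (average) = {h_avg + h_y:.3f}")    # about 1.774
```

The gap between the worst-case and average-case lines also illustrates why the equivalence of the two conditional definitions, which the abstract says is recovered after spoiling a few bits, does not hold unconditionally.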
Similar sources
Pseudoentropy: Lower-Bounds for Chain Rules and Transformations
Computational notions of entropy have recently found many applications, including leakage-resilient cryptography, deterministic encryption or memory delegation. The two main types of results which make computational notions so useful are (1) Chain rules, which quantify by how much the computational entropy of a variable decreases if conditioned on some other variable (2) Transformations, which q...
Quantum-Proof Extractors: Optimal up to Constant Factors
We give the first construction of a family of quantum-proof extractors that has optimal seed length dependence O(log(n/ε)) on the input length n and error ε. Our extractors support any min-entropy k = Ω(log n + log(1/ε)) and extract m = (1 − α)k bits that are ε-close to uniform, for any desired constant α > 0. Previous constructions had a quadratically worse seed length or were restricted to very...
Tsallis Entropy and Conditional Tsallis Entropy of Fuzzy Partitions
The purpose of this study is to define the concepts of Tsallis entropy and conditional Tsallis entropy of fuzzy partitions and to obtain some results concerning this kind of entropy. We show that the Tsallis entropy of fuzzy partitions has the subadditivity and concavity properties. We study this information measure under the refinement and zero mode subset relations. We check the chain rules for ...
ADK Entropy and ADK Entropy Rate in Irreducible-Aperiodic Markov Chain and Gaussian Processes
In this paper, the two-parameter ADK entropy, as a generalization of Rényi entropy, is considered and some of its properties are investigated. We will see that the ADK entropy for continuous random variables is invariant under a location transformation but not under a scale transformation of the random variable. Furthermore, the joint ADK entropy, conditional ADK entropy, and chain rule of this ent...
Bitwise Quantum Min-Entropy Sampling and New Lower Bounds for Random Access Codes
Min-entropy sampling gives a bound on the min-entropy of a randomly chosen subset of a string, given a bound on the min-entropy of the whole string. König and Renner showed a min-entropy sampling theorem that holds relative to quantum knowledge. Their result achieves the optimal rate, but it can only be applied if the bits are sampled in blocks, and only gives weak bounds for non-smooth min-entr...
Journal: CoRR
Volume: abs/1702.08476
Pages: -
Publication date: 2017